Maximum Entropy Approximation of Small Sample Distributions
Authors
Abstract
In this paper, we propose a simple approach for approximating small sample distributions based on the method of maximum entropy (maxent) densities. Unlike previous studies that use the maxent method as a curve-fitting tool, we focus on the “small sample asymptotics” of the approximation. We show that this method attains the same asymptotic order of accuracy as the classical Edgeworth expansion and the exponential Edgeworth expansion of Field and Hampel (1982). In addition, the maxent approximation attains the minimum Kullback-Leibler distance among all approximations from the same exponential family. For symmetric fat-tailed distributions, we adopt the dampening approach of Wang (1992) to ensure the integrability of maxent densities, effectively extending the maxent method to the entire admissible skewness-kurtosis space. Our numerical experiments show that the proposed method compares competitively with and often outperforms the Edgeworth and exponential Edgeworth expansions, especially when the sample size is small.
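The maxent density described above takes an exponential-family form p(x) ∝ exp(Σ_k λ_k x^k), with the coefficients λ chosen so the density reproduces a given set of moments. The sketch below illustrates that fitting step numerically. It is a toy illustration under our own assumptions (a bounded numerical integration range standing in for the dampening device, and a generic least-squares/Nelder-Mead solver); `maxent_fit` is a hypothetical helper name, not the paper's algorithm.

```python
# Toy sketch: fit a maximum-entropy density p(x) ~ exp(sum_k lam_k x^k),
# k = 1..4, so that its first four moments match given targets.
# Integration is truncated to [-bound, bound] for numerical stability,
# a crude stand-in for the dampening approach mentioned in the abstract.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

def maxent_fit(moments, bound=10.0):
    """Return lambda = (lam_1, ..., lam_4) matching E[x^k] = moments[k-1]."""
    def density(x, lam):
        # Unnormalized exponential-family density exp(sum lam_k x^k)
        return np.exp(sum(l * x ** (k + 1) for k, l in enumerate(lam)))

    def objective(lam):
        # Sum of squared moment residuals after normalizing by Z
        Z, _ = quad(lambda x: density(x, lam), -bound, bound)
        resid = 0.0
        for k in range(1, 5):
            mk, _ = quad(lambda x: x ** k * density(x, lam), -bound, bound)
            resid += (mk / Z - moments[k - 1]) ** 2
        return resid

    # Start near the standard normal, exp(-x^2/2), with a small
    # negative quartic coefficient to keep the tails integrable.
    lam0 = np.array([0.0, -0.5, 0.0, -0.01])
    res = minimize(objective, lam0, method="Nelder-Mead",
                   options={"xatol": 1e-10, "fatol": 1e-12, "maxiter": 2000})
    return res.x

# Example: target the standard normal moments (0, 1, 0, 3).
lam = maxent_fit([0.0, 1.0, 0.0, 3.0])
```

For these targets the exact solution is the Gaussian tilt (0, -1/2, 0, 0); the fitted quartic coefficient should therefore shrink toward zero, while for fat-tailed targets it stays strictly negative, which is precisely where the dampening device matters.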
Similar Articles
Information Theoretic Asymptotic Approximations for Distributions of Statistics
We propose an information theoretic approach to approximating asymptotic distributions of statistics using the maximum entropy densities. Conventional maximum entropy densities are typically defined on a bounded support. For distributions defined on unbounded supports, we propose to use an asymptotically negligible dampening function for the maximum entropy approximation such that it is well de...
Determination of Maximum Bayesian Entropy Probability Distribution
In this paper, we consider methods for determining maximum entropy multivariate distributions with a given prior, under the constraints that the marginal distributions, or the marginals together with the covariance matrix, are prescribed. Next, numerical solutions are considered for cases where no closed-form solution is available. Finally, these methods are illustrated via some numerical examples.
A Note on the Bivariate Maximum Entropy Modeling
Let X=(X1, X2) be a continuous random vector. Under the assumption that the marginal distributions of X1 and X2 are given, we develop models for the vector X when there is partial information about the dependence structure between X1 and X2. The models, which are obtained based on the well-known Principle of Maximum Entropy, are called the maximum entropy (ME) mo...
Entropy and the ‘compound’ Law of Small Numbers
An information-theoretic foundation for compound Poisson approximation limit theorems is presented, in analogy to the corresponding developments for the central limit theorem and for simple Poisson approximation. It is shown that the compound Poisson distributions satisfy a natural maximum entropy property within a natural class of distributions. Simple compound Poisson approximation bounds are...
Large-Sample Asymptotic Approximations for the Sampling and Posterior Distributions of Differential Entropy for Multivariate Normal Distributions
In the present paper, we propose a large-sample asymptotic approximation for the sampling and posterior distributions of differential entropy when the sample is composed of independent and identically distributed realizations of a multivariate normal distribution.